In quantum information theory, quantum mutual information, or von Neumann mutual information, named after John von Neumann, is a measure of correlation between subsystems of a quantum state. It is the quantum mechanical analog of Shannon mutual information.

== Motivation ==

For simplicity, it will be assumed that all objects in the article are finite-dimensional.

The definition of quantum mutual information is motivated by the classical case. For a probability distribution of two variables ''p''(''x'', ''y''), the two marginal distributions are

: <math>p(x) = \sum_{y} p(x,y), \qquad p(y) = \sum_{x} p(x,y).</math>

The classical mutual information ''I''(''X'', ''Y'') is defined by

: <math>I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y)),</math>

where ''S''(''q'') denotes the Shannon entropy of the probability distribution ''q''.

One can calculate directly

: <math>S(p(x)) + S(p(y))</math>
: <math>= -\left(\sum_x p(x) \log p(x) + \sum_y p(y) \log p(y)\right)</math>
: <math>= -\left(\sum_x \left(\sum_{y'} p(x,y')\right) \log \sum_{y'} p(x,y') + \sum_y \left(\sum_{x'} p(x',y)\right) \log \sum_{x'} p(x',y)\right)</math>
: <math>= -\sum_{x,y} p(x,y) \left(\log \sum_{y'} p(x,y') + \log \sum_{x'} p(x',y)\right)</math>
: <math>= -\sum_{x,y} p(x,y) \log p(x)\,p(y).</math>

So the mutual information is

: <math>I(X,Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}.</math>

But this is precisely the relative entropy between ''p''(''x'', ''y'') and ''p''(''x'')''p''(''y''). In other words, if we assume the two variables ''x'' and ''y'' to be uncorrelated, the mutual information is the ''discrepancy in uncertainty'' resulting from this (possibly erroneous) assumption.

It follows from the properties of relative entropy that ''I''(''X'', ''Y'') ≥ 0, with equality if and only if ''p''(''x'', ''y'') = ''p''(''x'')''p''(''y'').
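The identity above can be checked numerically. The following sketch is not part of the original article; it uses an arbitrary example joint distribution and illustrative function names to verify that the entropy formula for ''I''(''X'', ''Y'') agrees with the relative-entropy form.

<syntaxhighlight lang="python">
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in bits) of a probability array, ignoring zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Example joint distribution p(x, y) over two binary variables (chosen for illustration).
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal p(x) = sum_y p(x, y)
p_y = p_xy.sum(axis=0)   # marginal p(y) = sum_x p(x, y)

# Mutual information via the entropy formula I(X,Y) = S(p(x)) + S(p(y)) - S(p(x,y)).
mi_entropy = shannon_entropy(p_x) + shannon_entropy(p_y) - shannon_entropy(p_xy.ravel())

# Mutual information as the relative entropy between p(x, y) and p(x) p(y).
p_prod = np.outer(p_x, p_y)
mask = p_xy > 0
mi_relative = np.sum(p_xy[mask] * np.log2(p_xy[mask] / p_prod[mask]))

print(mi_entropy, mi_relative)  # both are about 0.278 bits and agree; I(X,Y) >= 0
</syntaxhighlight>

For this example the two expressions coincide at roughly 0.278 bits, and the value is non-negative, as the relative-entropy argument guarantees.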